MulStepNET: stronger multi-step graph convolutional networks via multi-power adjacency matrix combination

Authors

Abstract

Graph convolutional networks (GCNs) have become the de facto approach and achieved state-of-the-art results on many real-world problems involving graph-structured data. However, GCNs are usually shallow, because stacking layers leads to over-smoothing, which limits their expressive power for learning graph representations. Current methods that address this bottleneck suffer from high complexity and large numbers of parameters. Although Simple Graph Convolution (SGC) reduces the parameter count, it fails to distinguish the feature information of neighboring nodes at different distances. To overcome these limits, we propose MulStepNET, a stronger multi-step graph convolutional network architecture that captures more global information by simultaneously combining the information of multi-step neighborhoods. Compared with existing methods such as GCN and MixHop, MulStepNET aggregates neighborhoods at more distant hops via a multi-power adjacency matrix while fitting the fewest parameters and remaining computationally efficient. Experiments on the citation datasets Pubmed, Cora, and Citeseer demonstrate that the proposed model improves over SGC by 2.8, 3.3, and 2.1% respectively while keeping similar stability, and achieves better performance than other baselines in terms of both accuracy and stability.
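The core idea described above, combining several powers of the (normalized) adjacency matrix before a single feature propagation, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the symmetric GCN-style normalization and the equal weighting of the powers are assumptions; the names `normalize_adj` and `multi_step_features` are hypothetical.

```python
import numpy as np

def normalize_adj(adj):
    """Symmetrically normalize A + I (GCN-style): D^{-1/2} (A + I) D^{-1/2}."""
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    return a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def multi_step_features(adj, features, k=3):
    """Combine the 1st..k-th powers of the normalized adjacency matrix,
    then propagate node features once.

    SGC propagates with a single power A^k; summing multiple powers
    (assumed here with equal weights) exposes neighbors at several
    distances at the same time, in the spirit of MulStepNET's
    multi-power adjacency matrix combination.
    """
    a_norm = normalize_adj(adj)
    combined = np.zeros_like(a_norm)
    power = np.eye(adj.shape[0])
    for _ in range(k):
        power = power @ a_norm   # A^1, A^2, ..., A^k
        combined += power
    return combined @ features   # a single linear classifier would follow
```

As in SGC, the propagation step itself is parameter-free, so the only trainable component would be the final linear classifier applied to the propagated features.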


Similar articles

Multi-Neighborhood Convolutional Networks

We explore the role of scale for improved feature learning in convolutional networks. We propose multi-neighborhood convolutional networks, designed to learn image features at different levels of detail. Utilizing nonlinear scale-space models, the proposed multi-neighborhood model can effectively capture fine-scale image characteristics (i.e., appearance) using a small-size neighborhood, while c...


Edge Attention-based Multi-Relational Graph Convolutional Networks

The graph convolutional network (GCN) is a generalization of the convolutional neural network (CNN) to arbitrarily structured graphs. A binary adjacency matrix is commonly used in training a GCN. Recently, the attention mechanism has allowed networks to learn a dynamic and adaptive aggregation of the neighborhood. We propose a new GCN model on graphs where edges are characterized in multiple ...


Single Image Dehazing via Multi-scale Convolutional Neural Networks

The performance of existing image dehazing methods is limited by hand-designed features, such as the dark channel, color disparity and maximum contrast, with complex fusion schemes. In this paper, we propose a multi-scale deep neural network for single-image dehazing by learning the mapping between hazy images and their corresponding transmission maps. The proposed algorithm consists of a coars...


Event Extraction via Dynamic Multi-Pooling Convolutional Neural Networks

Traditional approaches to the task of ACE event extraction primarily rely on elaborately designed features and complicated natural language processing (NLP) tools. These traditional approaches lack generalization, take a large amount of human effort and are prone to error propagation and data sparsity problems. This paper proposes a novel event-extraction method, which aims to automatically ext...


3D mesh segmentation via multi-branch 1D convolutional neural networks

There is increasing interest in applying deep learning to 3D mesh segmentation. We observe that 1) existing feature-based techniques are often slow or sensitive to feature resizing, 2) there are minimal comparative studies, and 3) techniques often suffer from reproducibility issues. This study contributes in two ways. First, we propose a novel convolutional neural network (CNN) for mesh segmen...



Journal

Journal title: Journal of Ambient Intelligence and Humanized Computing

Year: 2021

ISSN: 1868-5137, 1868-5145

DOI: https://doi.org/10.1007/s12652-021-03355-x